Bayesian Optimization with Expensive Integrands

Authors

Abstract

Nonconvex derivative-free time-consuming objectives are often optimized using "black-box" optimization. These approaches assume very little about the objective. While broadly applicable, they typically require more function evaluations than methods that exploit problem structure. Often, such objectives are actually the sum or integral of a large number of functions, each of which consumes significant time when evaluated individually. This structure arises in designing aircraft, choosing parameters in ride-sharing dispatch systems, and tuning hyperparameters of deep neural networks. We develop a novel Bayesian optimization algorithm that leverages this structure to improve performance. Our algorithm is average-case optimal by construction when a single evaluation of the integrand remains within our evaluation budget. Achieving this one-step optimality requires solving a challenging value of information optimization problem, for which we provide an efficient discretization-free computational method. We also prove consistency of our method in both continuum and discrete finite domains for objective functions that are sums. In numerical experiments comparing against previous state-of-the-art methods, including those that also leverage this structure, our method performs as well as or better across a wide range of problems, and offers major improvements when evaluations are noisy or the integrand varies smoothly in the integrated variables.
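To make the setting concrete, the following toy sketch contrasts the two views of a sum-structured objective F(x) = Σ_w f(x, w): a black-box method can only observe the full sum, paying one integrand evaluation per term, whereas a structure-aware method can query individual terms f(x, w). The objective, the discrete set W, and the function names here are hypothetical illustrations of the problem setting, not the authors' algorithm.

```python
import numpy as np

# Discrete "environment" points w over which the objective is summed.
W = np.array([0.0, 0.5, 1.0])

def integrand(x, w):
    # Cheap stand-in for an expensive-to-evaluate component f(x, w).
    return -(x - w) ** 2

def F(x):
    # The actual optimization target: a sum of the expensive components.
    # A black-box method must pay len(W) integrand calls to see one value
    # of F; a structured method can spend its budget on individual
    # integrand(x, w) calls and share information across w via a surrogate.
    return sum(integrand(x, w) for w in W)

# Grid search as a stand-in for the optimizer; the true maximizer of this
# toy F is the mean of W, i.e. x = 0.5.
xs = np.linspace(-1.0, 2.0, 301)
best_x = xs[np.argmax([F(x) for x in xs])]
```

In the paper's setting each `integrand` call is itself expensive, which is why allocating the evaluation budget over (x, w) pairs, rather than over whole sums, can save many evaluations.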


Similar articles

Bayesian Optimization with Expensive Integrands

We propose a Bayesian optimization algorithm for objective functions that are sums or integrals of expensive-to-evaluate functions, allowing noisy evaluations. These objective functions arise in multi-task Bayesian optimization for tuning machine learning hyperparameters, optimization via simulation, and sequential design of experiments with random environmental conditions. Our method is averag...


Parallel Bayesian Global Optimization of Expensive Functions

We consider parallel global optimization of derivative-free expensive-to-evaluate functions, and propose an efficient method based on stochastic approximation for implementing a conceptual Bayesian optimization algorithm proposed by [10]. To accomplish this, we use infinitesimal perturbation analysis (IPA) to construct a stochastic gradient estimator and show that this estimator is unbiased.
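The IPA idea mentioned above can be illustrated on a toy expectation: to estimate the gradient of g(x) = E[max(x + Z, 0)] with Z ~ N(0, 1), differentiate the sample path, giving d/dx max(x + z, 0) = 1{x + z > 0}, and average. This is a minimal hypothetical sketch of sample-path (IPA) gradient estimation, not the acquisition-function gradient from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def ipa_gradient(x, n=100_000):
    # Sample-path derivative of max(x + z, 0) w.r.t. x is the indicator
    # 1{x + z > 0}; averaging it gives an unbiased gradient estimate.
    z = rng.standard_normal(n)
    return np.mean(x + z > 0)

# At x = 0 the true gradient is P(Z > 0) = 0.5.
grad = ipa_gradient(0.0)
```

Such unbiased estimators can be fed directly to stochastic gradient ascent, which is what makes the approach practical for acquisition functions that are themselves expectations.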


Bayesian Monte Carlo for the Global Optimization of Expensive Functions

In the last decades, enormous advances have been made in modelling complex (physical) systems by mathematical equations and computer algorithms. To deal with the very long running times of such models, a promising approach has been to replace them with stochastic approximations based on a few model evaluations. In this paper we focus on the often occurring case that the system modelled has t...




Journal

Journal: SIAM Journal on Optimization

Year: 2022

ISSN: 1052-6234, 1095-7189

DOI: https://doi.org/10.1137/19m1303125